
    True-data Testbed for 5G/B5G Intelligent Network

    Future beyond-fifth-generation (B5G) and sixth-generation (6G) mobile communications will shift from facilitating interpersonal communications to supporting the Internet of Everything (IoE), where intelligent communications with full integration of big data and artificial intelligence (AI) will play an important role in improving network efficiency and providing high-quality service. As a rapidly evolving paradigm, AI-empowered mobile communications demand large amounts of data acquired from real network environments for systematic testing and verification. Hence, we build the world's first true-data testbed for 5G/B5G intelligent network (TTIN), which comprises 5G/B5G on-site experimental networks, data acquisition & data warehouse, and AI engine & network optimization. In the TTIN, true network data acquisition, storage, standardization, and analysis are available, which enables system-level online verification of B5G/6G-oriented key technologies and supports data-driven network optimization through a closed-loop control mechanism. This paper elaborates on the system architecture and module design of TTIN. Detailed technical specifications and some of the established use cases are also showcased.
    Comment: 12 pages, 10 figures
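    The closed-loop control mechanism described above follows a familiar measure-store-analyze-reconfigure cycle. Below is a minimal Python sketch of such a cycle under stated assumptions: the collect_kpis and ai_engine_propose functions, the single transmit-power knob, and the toy throughput model are hypothetical stand-ins for TTIN's actual data-acquisition and AI-engine interfaces, not its real API.

```python
import random

def collect_kpis(network_state):
    """Acquire KPIs (e.g., throughput) from the live network.
    Simulated with noise here; in TTIN these would come from the
    data acquisition & data warehouse modules."""
    power = network_state["tx_power_dbm"]
    # Toy model: throughput peaks at a hypothetical optimum power level.
    throughput = -0.5 * (power - 30.0) ** 2 + 100.0 + random.gauss(0, 1.0)
    return {"throughput_mbps": throughput}

def ai_engine_propose(history):
    """Suggest the next configuration from logged (config, KPI) pairs.
    A simple randomized hill-climbing stand-in for the AI engine."""
    best = max(history, key=lambda h: h["kpi"]["throughput_mbps"])
    return best["config"] + random.choice([-1.0, 1.0])

# Closed-loop control: measure -> store -> analyze -> reconfigure.
state = {"tx_power_dbm": 20.0}
history = []
for _ in range(50):
    kpi = collect_kpis(state)
    history.append({"config": state["tx_power_dbm"], "kpi": kpi})
    state["tx_power_dbm"] = ai_engine_propose(history)

print("best config:", max(history, key=lambda h: h["kpi"]["throughput_mbps"]))
```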

    Accelerated Structure-Aware Sparse Bayesian Learning for 3D Electrical Impedance Tomography


    Learning Rate Optimization for Federated Learning Exploiting Over-the-air Computation

    Federated learning (FL), as a promising edge-learning framework, can effectively address latency and privacy issues by featuring distributed learning at the devices and model aggregation in the central server. To enable efficient wireless data aggregation, over-the-air computation (AirComp) has recently been proposed and has attracted immediate attention. However, fading of wireless channels can produce aggregate distortions in an AirComp-based FL scheme. To combat this effect, the concept of dynamic learning rate (DLR) is proposed in this work. We begin by considering the multiple-input-single-output (MISO) scenario, since the underlying optimization problem is convex and has a closed-form solution. We then extend our study to the more general multiple-input-multiple-output (MIMO) case, for which an iterative method is derived. Extensive simulation results on the MNIST and CIFAR10 datasets demonstrate the effectiveness of the proposed scheme in reducing the aggregate distortion and guaranteeing the testing accuracy. In addition, we present an asymptotic analysis and give a near-optimal receive beamforming design in closed form, which is verified by numerical simulations.
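    As a rough illustration of the DLR idea (not the paper's closed-form solution), the sketch below simulates one AirComp aggregation round in a MISO-like setting: devices apply truncated channel inversion, the multiple-access channel sums their gradients over the air, and the server rescales its learning rate by the realized aggregate gain. The device count, power limit, and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 10, 5               # devices, model dimension (illustrative)
eta_nominal = 0.1          # nominal learning rate
noise_std = 0.05

# Local gradients at each device (stand-ins for real FL updates).
grads = rng.normal(size=(K, d))

# MISO fading: one complex scalar channel per device this round;
# phase is assumed pre-compensated at the transmitter.
h = (rng.normal(size=K) + 1j * rng.normal(size=K)) / np.sqrt(2)

# Truncated channel inversion under a per-device power limit:
# weak channels cap their transmit scaling, distorting the aggregate.
p_max = 2.0
scale = np.minimum(1.0 / np.abs(h), p_max)
effective_gain = np.abs(h) * scale   # 1 where inversion succeeds, <1 otherwise

# Over-the-air superposition: signals add in the multiple-access channel.
rx = (effective_gain[:, None] * grads).sum(axis=0)
rx += rng.normal(scale=noise_std, size=d)

# Dynamic learning rate: rescale the step by the realized aggregate gain
# so the update magnitude tracks the ideal (distortion-free) average.
eta_t = eta_nominal * K / effective_gain.sum()
update = eta_t * rx / K

ideal = eta_nominal * grads.mean(axis=0)
print("distortion vs ideal step:", np.linalg.norm(update - ideal))
```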

    Link-level simulator for 5G localization

    Channel-state-information-based localization in 5G networks is a promising way to obtain highly accurate positions compared to previous generations of communication networks. However, there is no unified and effective platform to support research on 5G localization algorithms. This paper releases a link-level simulator for 5G localization that can depict the realistic physical behavior of 5G positioning-signal transmission. Specifically, we first develop a simulation architecture that supports elaborate parameter configuration and physical-layer processing, with link modeling at both sub-6GHz and millimeter-wave (mmWave) frequency bands. Subsequently, the critical physical-layer components that determine localization performance are designed and integrated; in particular, we develop a lightweight new-radio channel model and hardware-impairment functions that significantly limit parameter-estimation accuracy. Finally, we present three application cases to evaluate the simulator: two-dimensional mobile-terminal localization, mmWave beam sweeping, and beamforming-based angle estimation. The numerical results in these cases show how the performance of localization algorithms varies under different impairment conditions.
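    As a flavor of the kind of algorithm such a simulator would evaluate, here is a minimal sketch of two-dimensional terminal localization from time-of-arrival measurements via linearized least squares. The anchor layout, timing-error level, and the ToA method itself are illustrative assumptions, not components of the released simulator.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Hypothetical gNB positions (metres) and a true terminal position.
anchors = np.array([[0.0, 0.0], [200.0, 0.0], [0.0, 200.0], [200.0, 200.0]])
ue_true = np.array([60.0, 110.0])

rng = np.random.default_rng(1)
toa = np.linalg.norm(anchors - ue_true, axis=1) / C
toa += rng.normal(scale=3e-9, size=len(anchors))  # 3 ns timing error
ranges = toa * C

# Linearize by subtracting the first anchor's range equation,
# which eliminates the quadratic term ||x||^2.
A = 2 * (anchors[1:] - anchors[0])
b = (ranges[0] ** 2 - ranges[1:] ** 2
     + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
est, *_ = np.linalg.lstsq(A, b, rcond=None)

print("estimate:", est, "error (m):", np.linalg.norm(est - ue_true))
```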

    Toward 6G TKμ Extreme Connectivity: Architecture, Key Technologies and Experiments

    Sixth-generation (6G) networks are evolving towards new features and order-of-magnitude enhancements of systematic performance metrics compared to current 5G. In particular, 6G networks are expected to achieve extreme connectivity performance with Tbps-scale data rates, Kbps/Hz-scale spectral efficiency, and μs-scale latency. To this end, an original three-layer 6G network architecture is designed to realise uniform full-spectrum cell-free radio access and to provide task-centric, agile, proximate support for diverse applications. The designed architecture features the super edge node (SEN), which integrates connectivity, computing, AI, data, etc. On this basis, a technological framework of pervasive multi-level (PML) AI is established in the centralised unit to enable task-centric near-real-time resource allocation and network automation. We then introduce a radio access network (RAN) architecture of full-spectrum uniform cell-free networks, which is among the most attractive RAN candidates for 6G TKμ extreme connectivity. A few of the most promising key technologies, i.e., cell-free massive MIMO, photonics-assisted Terahertz wireless access, and spatiotemporal two-dimensional channel coding, are discussed further. A testbed is implemented and extensive trials are conducted to evaluate the innovative technologies and methodologies. The proposed 6G network architecture and technological framework demonstrate exciting potential for full-service and full-scenario applications.
    Comment: 15 pages, 12 figures
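    As a toy numerical sketch of one of the key technologies named above, the snippet below computes per-user downlink spectral efficiency for cell-free massive MIMO with conjugate (maximum-ratio) precoding over i.i.d. Rayleigh channels. The array sizes, per-user power normalization, and channel model are illustrative assumptions, not the paper's testbed configuration.

```python
import numpy as np

rng = np.random.default_rng(2)
M, K = 32, 4  # distributed access points, users (illustrative sizes)

# Rayleigh channels from every AP antenna to every user; in a cell-free
# system all APs jointly serve all users with no cell boundaries.
H = (rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K))) / np.sqrt(2)

# Conjugate (maximum-ratio) precoding, unit transmit power per user.
W = H.conj()
W /= np.linalg.norm(W, axis=0, keepdims=True)

# Effective downlink gains: desired signal on the diagonal,
# inter-user interference off the diagonal.
G = H.T @ W                                    # K x K
signal = np.abs(np.diag(G)) ** 2
interference = (np.abs(G) ** 2).sum(axis=1) - signal
noise = 1.0
sinr = signal / (interference + noise)
print("per-user spectral efficiency (bit/s/Hz):", np.log2(1 + sinr))
```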

    Sea Clutter Cancellation for Passive Radar Sensor Exploiting Multi-Channel Adaptive Filters
